LLM Service
To automate your workflows, you need to sign up for an account first. It’s free.

This is a library for integrating an LLM service into your flow function on flows.network.

A tiny example:

use flowsnet_platform_sdk::logger;
use lambda_flows::{request_received, send_response};
use llmservice_flows::{
    chat::ChatOptions,
    LLMServiceFlows,
};
use serde_json::Value;
use std::collections::HashMap;

#[no_mangle]
#[tokio::main(flavor = "current_thread")]
pub async fn run() {
    logger::init();
    request_received(handler).await;
}

async fn handler(_qry: HashMap<String, Value>, body: Vec<u8>) {
    let co = ChatOptions {
        model: Some("gpt-4"),
        token_limit: 8192,
        ..Default::default()
    };

    // The url is the base URL of the API; the endpoint path
    // (`/chat/completions` for chat completions, `/embeddings`
    // for creating embeddings) is appended automatically.
    let mut lf = LLMServiceFlows::new("https://api.openai.com/v1");
    lf.set_api_key("your api key");

    let r = match lf
        .chat_completion(
            "any_conversation_id",
            String::from_utf8_lossy(&body).into_owned().as_str(),
            &co,
        )
        .await
    {
        Ok(c) => c.choice,
        Err(e) => e,
    };

    send_response(
        200,
        vec![(
            String::from("content-type"),
            String::from("text/plain; charset=UTF-8"),
        )],
        r.as_bytes().to_vec(),
    );
}

This example lets you hold a conversation with ChatGPT via chat_completion, triggered by a Lambda request.
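Note that the handler receives the request body as raw bytes (`Vec<u8>`) and converts it with `String::from_utf8_lossy`, which replaces any invalid UTF-8 with U+FFFD instead of returning an error, so the flow never fails on malformed input. A small standalone sketch of that conversion (`body_to_prompt` is a hypothetical helper for illustration, not part of the SDK):

```rust
// Convert a raw request body into a prompt string.
// Invalid UTF-8 sequences become the replacement character (U+FFFD)
// rather than causing an error, mirroring what the handler above does.
fn body_to_prompt(body: &[u8]) -> String {
    String::from_utf8_lossy(body).into_owned()
}

fn main() {
    // Valid UTF-8 passes through unchanged.
    assert_eq!(body_to_prompt(b"hello"), "hello");
    // 0xFF is not valid UTF-8; it is replaced, not rejected.
    assert_eq!(body_to_prompt(&[0x68, 0x69, 0xFF]), "hi\u{FFFD}");
}
```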

The full documentation is here.